05. Introduction to Identifying Hazards


Examples of human, technology, and human-technology interaction errors

Let's go through a real-world example of each type of error and see how they lead to accidents.

Human Error

In February 2016, two trains collided head-on near Munich, Germany. Although that stretch of track had an automatic signaling system that should have stopped the trains, a human controller switched off the system because one of the trains was running late. This is an accident due to human error. As an engineer, you would try to figure out why the train controller turned off the system and what could have been done to avoid the situation.

Human error is especially important in the auto industry. In the United States, the National Highway Traffic Safety Administration (NHTSA) estimates that 94% of car accidents involve human error.

The European Union similarly estimates that about 90% of vehicular accidents involve human error.

Technology Error

For an example of technology causing accidents, consider the European Space Agency's Ariane 5 rocket. The rocket exploded during its first test flight on June 4th, 1996. Engineers had reused portions of the Ariane 4 software. The Ariane 5's steeper trajectory produced a 64-bit floating point velocity value larger than anything the Ariane 4 could generate; the reused code converted that value to a 16-bit signed integer, the conversion overflowed, the guidance system failed, and the rocket was destroyed.
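To make the failure mode concrete, here is a minimal Python sketch of an unchecked narrowing conversion from a float to a 16-bit signed integer. The velocity values are purely illustrative (the real Ariane code was written in Ada), but the wraparound behavior is the same kind of silent corruption that doomed the flight.

```python
def to_int16(x: float) -> int:
    """Convert a float to a 16-bit signed integer with wraparound,
    mimicking an unchecked narrowing conversion."""
    n = int(x) & 0xFFFF                      # keep only the low 16 bits
    return n - 0x10000 if n >= 0x8000 else n # reinterpret as signed

# A value within the old rocket's envelope converts safely:
print(to_int16(20000.0))   # 20000

# A larger value overflows and wraps to a meaningless negative number:
print(to_int16(40000.0))   # -25536
```

Languages with checked conversions would raise an error here instead of silently returning garbage, which is why many safety standards require explicit handling of every narrowing conversion.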

Software bugs are a major source of technology error across many industries, including the car industry. An internet search for "software bugs cars" will give you a sense of how serious the issue is.

Human-Technology Interaction Error

And finally, let's discuss how human-technology interaction causes accidents. Back in 2013, Asiana Flight 214 crashed when landing in San Francisco. A pilot selected an incorrect autopilot mode and inadvertently switched off the auto throttle function. The plane came in for landing at a very low speed and low altitude.

The pilots' over-reliance on, and misunderstanding of, the automatic system contributed to the crash. The autopilot system should have included a warning telling the pilots that the automatic throttle was not maintaining enough speed.

Whenever a human and a machine share control of a system, extra care needs to be taken when evaluating safety; the boundary between what the human is supposed to do and what the machine is supposed to do needs to be clear.

As advanced driver-assistance systems (ADAS) become ubiquitous, human-technology interaction errors might become more prevalent. We as drivers need to understand the warning signals coming from our lane keeping assistance and adaptive cruise control systems. Interfaces need to make it clear when the vehicle expects us to take over, and these handoffs need to be analyzed as safety issues. The Asiana Flight 214 example shows the consequences of a flawed human-technology system design.

Part of your job as a safety engineer will be to anticipate what will go wrong with a vehicle when the vehicle is still in the design phase. You will think about situations and conditions where humans, technology, and human-technology interaction can lead to dangerous situations and cause accidents. In other words, you will identify hazards.

Errors and Self-Driving Cars

One of the aims of self-driving cars is to take humans completely out of the equation. The trade-off is that autonomous vehicles could introduce more technology errors.

Take a machine learning algorithm as an example. What happens if a pedestrian training set does not include pedestrians in wheelchairs? The system might fail to recognize a pedestrian in a wheelchair. How accurate do our results need to be on a validation set? Who determines what the training set needs to contain?
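One way to catch this kind of gap early is to audit a training set's label coverage against a required list of categories before training begins. The sketch below is a simplified illustration; the label names and required categories are hypothetical, not from any real dataset.

```python
from collections import Counter

# Hypothetical labels from a pedestrian-detection training set
training_labels = [
    "pedestrian_walking", "pedestrian_walking", "cyclist",
    "pedestrian_running", "cyclist", "pedestrian_walking",
]

# Categories a safety review decided the training set must cover
required_categories = {
    "pedestrian_walking", "pedestrian_running",
    "pedestrian_wheelchair", "cyclist",
}

counts = Counter(training_labels)
missing = required_categories - counts.keys()

if missing:
    print("Training set is missing categories:", sorted(missing))
```

A check like this only catches categories someone thought to require; deciding what belongs on the required list is itself a hazard-identification activity.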

Autonomous vehicle technology is so new that standards like ISO 26262 do not yet address certain issues specific to self-driving cars, such as machine learning algorithms. But the principles of functional safety, which uses systems engineering to identify and lower risks, apply to autonomous vehicles as well!

Please note that the example above addresses nominal performance, which relates more to Safety of the Intended Functionality (SOTIF) than to functional safety (FuSa). The ISO 26262 committee is authoring a sister specification, ISO 21448, to address nominal performance.

Quiz

Sources of Errors

QUIZ QUESTION:

Match the term on the left with the situation on the right

ANSWER CHOICES:

Error types: Human Error, Technology Error, Human-Technology Interaction Error

Situations:

A driver goes too fast around a curve and falls off the road.

A radar based vehicle detection system is too sensitive and buzzes even when a crash is not imminent. The driver ignores warnings, which could lead to an accident if a crash were imminent.

A software bug causes the vehicle to accelerate inadvertently.

SOLUTION:

Human Error: A driver goes too fast around a curve and falls off the road.

Human-Technology Interaction Error: A radar based vehicle detection system is too sensitive and buzzes even when a crash is not imminent. The driver ignores warnings, which could lead to an accident if a crash were imminent.

Technology Error: A software bug causes the vehicle to accelerate inadvertently.